Patent abstract:
Optical measurement method and measurement system for determining 3D coordinates on the surface of a measurement object. The present invention relates to an optical measurement method for determining the 3D coordinates of a multiplicity of measurement points on the surface of the measurement object (1). For this purpose, the surface of the measurement object (1) is illuminated with a pattern sequence comprising several patterns (2a, 2b) using a projector (3), an image sequence of the surface of the measurement object (1) illuminated with the pattern sequence is recorded with a camera system (4), and the 3D coordinates of the measurement points are determined by evaluating the image sequence. According to the invention, while the image sequence is being recorded, at least during the exposure times of the individual images of the image sequence, translational and/or rotational accelerations of the projector (3), of the camera system (4) and/or of the measurement object (1) are measured at least at a measurement rate such that, during the exposure times of the respective individual images of the image sequence, in each case a plurality of values, in particular a multiplicity of values, is captured for the accelerations. It is therefore possible, based on the measured accelerations, to take the movements into account algorithmically when determining the 3D coordinates.
Publication number: BR112012032875B1
Application number: R112012032875
Filing date: 2011-06-09
Publication date: 2020-02-04
Inventor: Siercks Knut
Applicant: Leica Geosystems Ag;
IPC main classification:
Patent description:

Descriptive Report of the Invention Patent for OPTICAL MEASUREMENT METHOD AND MEASUREMENT SYSTEM FOR DETERMINING 3D COORDINATES OF A MULTIPLICITY OF MEASUREMENT POINTS.
[0001] The present invention relates to an optical measurement method for determining the 3D coordinates of a multiplicity of measurement points on the surface of the measurement object and to a measurement system configured for this purpose.
[0002] Apparatus and methods of this type are used in particular in mechanical engineering, automotive engineering, the ceramics industry, the shoe industry, the jewelry industry, dental technology and human medicine (orthopedics), among other areas, and are employed, for example, for measurement and logging for quality control, reverse engineering, rapid prototyping, rapid milling or digital mock-up.
[0003] The increasing demand for comprehensive quality control in the running production process and for the digitization of the spatial shape of prototypes means that the recording of surface topographies has become an increasingly frequent measurement task. The objective here is to determine the coordinates of individual points on the surface of the object to be measured in a short period of time.
[0004] Measurement systems using image sequences, which are known from the prior art for determining the 3D coordinates of measurement objects and which can be configured, for example, as portable, hand-held and/or fixedly mounted systems, generally have a pattern projector for illuminating the measurement object with a pattern and are therefore sometimes also referred to as pattern-projecting 3D scanners or light-structure 3D scanners. The pattern
Petition 870190108144, of 10/24/2019, p. 11/54
projected onto the surface of the measurement object is recorded by a camera system as a further constituent part of the measurement system.
[0005] As part of a measurement, the projector thus illuminates the measurement object sequentially in time with different patterns (for example, parallel bright and dark stripes of different widths; the stripe pattern can in particular also be rotated, for example by 90°). The camera(s) record(s) the projected stripe pattern at a known viewing angle relative to the projection. With each camera, one image is recorded for each projection pattern. In this way, a temporal sequence of different brightness values is produced for each pixel of all the cameras.
[0006] In addition to stripes, however, it is also possible to project other suitable patterns, such as, for example, random patterns, pseudocodes, etc. Patterns suitable for this are sufficiently well known to the person skilled in the art from the prior art. Pseudocodes permit, for example, an easier absolute association of object points, which becomes increasingly difficult when very fine stripes are projected. For this purpose, it is thus possible to project, in rapid succession, first one or more pseudocodes and then a fine stripe pattern, or, in successive recordings, different stripe patterns that become finer in sequence, until the desired accuracy in the resolution of the measurement points on the surface of the measurement object is achieved.
[0007] The 3D coordinates of the surface of the measurement object can then be calculated from the recorded image sequence by means of image processing, according to methods known to the person skilled in the art from photogrammetry and/or stripe projection. By way of example, such
measurement methods and measurement systems are described in WO 2008/046663, DE 101 27 304 A1, DE 196 33 686 A1 or DE 10 2008 036 710 A1.
[0008] The camera system typically comprises one or more digital cameras located in a known spatial position relative to one another during the measurement. In order to guarantee a stable position of the cameras relative to one another, they are generally fixedly integrated together, with known positioning and spatial alignment, in a common housing; in particular, the cameras are aligned in such a way that the fields of view of the individual cameras largely intersect. Two or three cameras are often used in this case. The projector can be fixedly connected to the camera system (if separate cameras are used, possibly also to only some of the cameras present in the camera system) or positioned completely separately from the camera system.

[0009] The desired three-dimensional coordinates of the surface are calculated in two steps in the general case, that is, where the positioning and relative alignment of the projector with respect to the camera system are not fixed relative to each other and therefore not known in advance. In a first step, the coordinates of the projector are determined as follows. At a given object point, the image coordinates in the camera image are known. The projector corresponds to an inverted camera. The stripe number can be calculated from the succession of brightness values measured from the image sequence for each camera pixel. In the simplest case, this is done by means of a binary code (for example a Gray code), which characterizes the stripe number as a discrete coordinate in the projector. A higher degree of accuracy can be achieved with
what is known as the phase shift method, since it can determine a non-discrete coordinate. It can be used either as a supplement to a Gray code or as an absolute-measuring heterodyne method.
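As an illustrative sketch (not the patent's implementation) of the decoding just described: thresholding the per-pixel brightness sequence yields Gray-code bits and hence a discrete stripe number, and a 4-step phase shift yields a non-discrete sub-stripe coordinate. The thresholding scheme and the intensity model I_k = A + B·cos(φ + k·90°) are assumptions for this sketch:

```python
# Sketch: per-pixel stripe decoding from a Gray-code illumination sequence,
# plus a 4-step phase-shift refinement. All names are illustrative.
import math

def gray_to_binary(gray_bits):
    """Convert a tuple of Gray-code bits (MSB first) to an integer."""
    value = gray_bits[0]
    result = value
    for bit in gray_bits[1:]:
        value ^= bit          # each binary bit is the XOR of Gray bits so far
        result = (result << 1) | value
    return result

def decode_stripe_number(brightness_seq, threshold):
    """Binarize the brightness values measured for one camera pixel over the
    Gray-code image sequence and decode the discrete stripe number."""
    bits = tuple(1 if b > threshold else 0 for b in brightness_seq)
    return gray_to_binary(bits)

def phase_shift_coordinate(i0, i90, i180, i270):
    """4-step phase shift: recover the fractional (sub-stripe) phase in
    [0, 1) from four intensity samples at 90-degree phase offsets,
    assuming I_k = A + B*cos(phase + k*90deg)."""
    phase = math.atan2(i270 - i90, i0 - i180)   # in (-pi, pi]
    return (phase % (2 * math.pi)) / (2 * math.pi)
```

The discrete Gray-code stripe number and the fractional phase would then be combined into one non-discrete projector coordinate.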
[00010] After the position of the projector has been determined in this way, or in the case where its position relative to the camera system is already known, the 3D coordinates of the measurement points on the surface of the measurement object can be determined, for example with the forward intersection method, as follows. The stripe number in the projector corresponds to the image coordinate in the camera. The stripe number specifies a plane of light in space, and the image coordinate specifies a ray of light. With the camera and projector positions known, the point of intersection of the plane and the straight line can be calculated. This is the desired three-dimensional coordinate of the object point in the coordinate system of the sensor. The geometric position of all image rays must be known exactly. The rays are calculated exactly using the forward intersection known from photogrammetry.
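The plane-ray intersection described here can be sketched as follows (purely illustrative; the calibrated plane and ray parameters are assumed to be given):

```python
# Sketch: triangulating a 3D point as the intersection of the projector's
# light plane (given by the stripe number) with the camera's viewing ray
# (given by the image coordinate). Illustrative, not the patent's code.
import numpy as np

def intersect_plane_ray(plane_point, plane_normal, ray_origin, ray_dir):
    """Return the 3D point where the ray o + t*d meets the plane
    {x : dot(n, x - p0) = 0}."""
    n = np.asarray(plane_normal, dtype=float)
    o = np.asarray(ray_origin, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    denom = n @ d
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the light plane")
    t = n @ (np.asarray(plane_point, dtype=float) - o) / denom
    return o + t * d
```

For example, a camera ray from the origin in direction (1, 0, 1) meets the light plane x = 2 at the object point (2, 0, 2).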
[00011] In order to achieve better accuracy in this measurement method for calculating the 3D coordinates, the non-ideal properties of real lens systems, which result in distortions of the image, can be adapted by a distortion correction, and/or a precise calibration of the imaging properties can take place. All imaging properties of the projector and cameras can be measured in calibration processes known to the person skilled in the art (for example, a series of calibration recordings), and a mathematical model describing these imaging properties can be generated from them (using photogrammetric methods - in particular a bundle adjustment calculation
- the parameters that define the imaging properties are determined from the series of calibration recordings, for example).

[00012] In short, in the pattern projection method or in light-structure 3D scanners, illumination of the object with the sequence of light patterns is thus necessary in order to permit an unambiguous determination of the depth of the measurement points in the measurement region with the aid of triangulation (forward intersection). Thus, in general, a plurality of recordings (i.e., a series of images) with illumination of the measurement object by correspondingly different patterns (i.e., with a corresponding series of patterns) is necessary to ensure sufficiently high accuracy of the measurement result. In hand-held systems known from the prior art, such as, for example, the measuring device described in WO 2008/046663, the illumination sequence must in this case take place so quickly that a movement by the operator during the recording of the series of images does not cause measurement errors. The pixels recorded by the cameras for the individual projections must be able to be associated with one another with sufficient accuracy. Thus, the image sequence must take place faster than the pattern or image shift caused by the operator. Since the optical energy that can be emitted by the projector is limited by the available optical sources and by radiation protection regulations, this results in a limitation of the detectable energy in the camera system and thus in a limitation of the measurement on weakly reflecting surfaces of the measurement object. In addition, projectors are limited in terms of projection speed (image rate). Typical maximum image rates of such projectors are, for example, around 60 Hz.
[00013] For a measurement operation comprising the projection of a series of patterns and the recording of an image sequence of the
respective patterns with the camera system, a measurement duration of, for example, approximately 200 ms is required with conventional measuring devices (as an example: recording sequences of 8 to 10 images with an exposure duration of 20 ms to 40 ms per image can result in total recording times or measurement durations of between 160 ms and 400 ms per measurement position).
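The quoted bounds follow directly from the number of images and the per-image exposure duration:

```python
# Minimal check of the measurement durations quoted above: sequences of
# 8 to 10 images at exposure durations of 20 ms to 40 ms per image.
for n_images, exposure_ms in [(8, 20), (10, 40)]:
    total_ms = n_images * exposure_ms
    print(f"{n_images} images x {exposure_ms} ms = {total_ms} ms")
# lower bound: 8 x 20 ms = 160 ms; upper bound: 10 x 40 ms = 400 ms
```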
[00014] In the case of insufficient steadiness, or of insufficiently high stability of the position and alignment, of the camera arrangement, of the projector (or, if appropriate, of a measuring head containing the camera arrangement and the projector in integrated fashion) and of the measurement object relative to one another during the measurement operation (in the measurement position), various undesirable effects can occur which make the evaluation more difficult, more complicated or even impossible, or which at least adversely affect the attainable accuracy.
[00015] Insufficient steadiness of the camera arrangement, of the projector (or, if appropriate, of a measuring head containing the camera arrangement and the projector in integrated fashion) or of the measurement object can have various causes in this case.
[00016] Firstly, vibrations in the measurement environment (for example, if measurements are carried out at a production station integrated into a production line) can be transferred to the support holding the measurement object or to a robot arm holding the measuring head and thus result in disturbing oscillations. Hitherto, therefore, complicated measures for oscillation damping have been necessary, or it has been necessary to move to special measurement spaces, which, however, makes the production process significantly more complicated (since removal of the measurement object from within
the production line and its transport into the measurement space designed specifically for it become necessary).
[00017] In hand-held systems, the main cause of insufficient steadiness is, in particular, the natural tremor of the user's hand.
[00018] The negative effects to be mentioned here, which can be caused by a lack of stability in the position and orientation of the camera arrangement, of the projector and of the measurement object relative to one another, are, firstly, motion blur and/or camera shake in individual images recorded in an image sequence.
[00019] Secondly, however, inconsistencies of the individual images of an image sequence relative to one another with regard to their respective recording positions and recording directions relative to the measurement object (that is, variability in the recording positions and directions of the individual images within an image sequence) can occur, such that the respective association of pixels in the individual images with identical measurement points on the surface of the measurement object is either made entirely impossible or can be made possible only with extremely high computational complexity and the inclusion of information from a multiplicity of images of the same region of the surface of the measurement object (that is, it may be necessary to subsequently bring the individual images into a spatial relationship computationally, which involves a great deal of work; for this reason, hitherto, primarily as a preventive measure against this effect, an excess of images per image sequence has been recorded, serving mainly only for calculating the spatial relationship of the recording positions and directions of the individual images).
[00020] In order to expand the measurement region on the measurement object (for example, in order to measure an object in its entirety), a plurality of measurements in succession (from various measurement positions and at various viewing angles of the cameras relative to the measurement object) is often necessary, the results of the various measurements subsequently being linked to one another. This can take place, for example, by regions being captured in an overlapping manner in the respective measurement operations and the respective overlap being used to correspondingly join together the 3D coordinates (that is, point clouds) obtained in the various measurement operations (that is, identical or similar distributions can be identified in the point clouds determined in the individual measurement operations and the point clouds can be joined accordingly).
[00021] This joining operation, however, is generally extremely computation-intensive and requires a significant and inconveniently high outlay in terms of time and energy even when the most powerful processors are available. When, for example, a robot arm is used to hold and guide the measuring head, a reduction of the computational outlay required for the joining operation can thus be achieved by capturing the recording positions and directions of the individual measurements on the basis of the respective position of the robot arm and using them for the joining as prior information (for example, as boundary conditions).
[00022] The disadvantages in this case are the relatively low accuracy with which the measurement position can be determined on the basis of the robot arm position and - nevertheless - the requirement that such a robot arm be present. Thus, the computational power
required for joining measurement results from a plurality of measurement operations cannot be reduced in this way in the case of hand-held measurement systems.
[00023] Further disadvantages of prior art systems using substantially coherent optical radiation for the pattern illumination are local measurement inaccuracies or measurement gaps due to the speckle fields occurring undesirably in the respective patterns of the pattern sequence.
[00024] The technical objective on which the invention is based is therefore the provision of an improved optical measurement method and measurement system using image sequences for determining 3D coordinates on the surface of the measurement object, in particular one in which one or more of the aforementioned disadvantages can be reduced or eliminated.
[00025] More specific objectives of the invention are, in this case, enabling a more precise determination of the 3D coordinates even when the positional stability of the projector, of the camera system and/or of the measurement object is unsatisfactory for measurement systems known from the prior art (for example, owing to undesired oscillations, vibrations or tremors) during the measurement operation (that is, during the projection of the pattern sequence and the recording of the image sequence). Specifically, the intention is, firstly, to enable a reduction of errors or inaccuracies in the determination of the 3D coordinates that are caused by camera shake and/or motion blur in the individual images of an image sequence. Secondly, the goal is also to be able to reduce or eliminate errors caused by variability, occurring in the case of unsteadiness, in the recording positions and recording directions of the images of an image sequence relative to one another.
[00026] Another specific objective, when a coherent source is used for the projection of patterns, is the reduction of local measurement gaps or local measurement inaccuracies caused by speckle occurring in the pattern on the surface of the measurement object.
[00027] Another specific objective - in particular for hand-held measurement systems - is to simplify the joining of the measurement results (for example, the point clouds produced from them) of a plurality of measurement operations and/or to enable a reduction of the computational power required for such joining.
[00028] The invention relates to a pattern projection measurement method using image sequences for determining 3D coordinates of the surface of the measurement object and to a measurement system configured for said purpose.
[00029] Within the context of the invention, during the measurement operation - that is, during the recording of the image sequence - translational and/or rotational accelerations of the pattern projector, of the camera system (or, if appropriate, of a measuring head containing the camera arrangement and the projector in integrated fashion) and/or of the measurement object are measured using inertial sensors, and the measured accelerations are taken into account when determining the 3D coordinates.
[00030] In a more detailed description, according to the invention, the translational and/or rotational accelerations are measured during the recording of the image sequence, at least during the exposure times of the individual images of the image sequence, at least at such a measurement rate that, during the exposure times of the respective individual images of the image sequence, in each case a plurality of values, in particular a multiplicity of values, is captured for the accelerations. On this basis, according to the
invention, the movements of the projector, of the camera system and/or of the measurement object that occur during the exposure times of the respective individual images of the image sequence and thus cause camera shake and/or motion blur in the respective individual images of the image sequence are then taken into account algorithmically, on the basis of the measured accelerations, in the determination of the 3D coordinates.

[00031] In particular, as a function of the measured accelerations, compensation and/or correction of the camera shake and/or motion blur caused by the movements of the projector, of the camera system and/or of the measurement object occurring during the exposure times of the respective individual images of the image sequence takes place respectively in the individual images of the image sequence.
[00032] For this purpose, the inertial sensors can be arranged on the camera system, on the projector and/or on the measurement object, the inertial sensors in particular being configured together as an integrated inertial measurement unit.

[00033] Depending on the embodiment variant of the camera system and of the projector, the inertial sensors can in this case also be appropriately integrated into a housing containing components of the camera system and/or of the projector. The camera system (also referred to as the camera arrangement) can, for example - as already known from the prior art - be formed from one, two, three, four or more cameras, which are arranged with fixed and known positioning and orientation relative to one another in a common housing and are configured for the substantially simultaneous recording of individual images. Alternatively, the individual cameras of the camera arrangement can also be configured so as to be
physically separate from one another, in each case with a dedicated housing, which, however, generally makes the evaluation of the image sequences more difficult, since in this case the relative spatial relationship of the cameras with respect to one another is not predefined (which in the normal case results in a higher computational outlay in the evaluation of the image sequences). In addition, in the case of physically separate cameras in hand-held systems, there is a difficulty in use owing to the fact that a plurality of separate pieces of equipment have to be carried and held. For these two reasons, the camera system can - in particular in hand-held systems or systems configured to be mounted on a robot arm - be accommodated together with the projector, with fixed and known positioning and orientation relative to one another, physically in a common measuring head of the measurement system, in which, in this case and according to the invention, the inertial sensors or the inertial measurement unit can also be arranged.
[00034] Likewise, a group of inertial sensors can also - alternatively or additionally - be configured to be mounted on the object to be measured, which group transmits the measured accelerations (or the movements already derived from them, or even positions and alignments) to the evaluation unit of the measurement system for the purpose of being taken into account in the determination of the 3D coordinates.
[00035] Specifically, the inertial sensors here are combined and integrated in an inertial measurement unit based on MEMS components (MEMS: microelectromechanical systems) in such a way that the inertial measurement unit is configured to measure the accelerations in all six degrees of freedom, in particular at a measurement rate of, for
example, between approximately 50 and 2000 Hz.
[00036] As is known to the person skilled in the art, the accelerations of the six degrees of freedom can here, as a rule, be measured using the following types of sensors through the corresponding combination of a plurality of inertial sensors in an inertial measurement unit (IMU):
[00037] Three orthogonally arranged acceleration sensors (also referred to as translation sensors) detect the linear acceleration along the x, y or z axis. From this, the translational movement (and the relative position) can be calculated. Three orthogonally arranged rotation rate sensors (also referred to as gyroscopic sensors) measure the angular acceleration about the x, y or z axis. From this, the rotational movement (and the relative alignment) can be calculated.
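A minimal dead-reckoning sketch (an illustration, not the patent's algorithm) of how the six channels can be integrated to a relative pose; it assumes the rotational channels are available as angular rates and ignores gravity compensation and sensor bias:

```python
# Sketch under simplifying assumptions (no gravity compensation, no bias,
# small-angle rotation update): propagating a relative pose from six IMU
# channels - 3 linear accelerations and 3 angular rates - sampled at a
# fixed rate, e.g. in the 50-2000 Hz range mentioned above.
import numpy as np

def skew(w):
    """Cross-product matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]], dtype=float)

def integrate_imu(samples, dt):
    """samples: iterable of (accel_xyz, gyro_xyz) in the body frame.
    Returns position, velocity and rotation matrix relative to the start."""
    R = np.eye(3)                 # body-to-reference alignment
    v = np.zeros(3)               # velocity in the reference frame
    p = np.zeros(3)               # position in the reference frame
    for accel, gyro in samples:
        R = R @ (np.eye(3) + skew(np.asarray(gyro, dtype=float) * dt))
        a_ref = R @ np.asarray(accel, dtype=float)   # rotate into reference frame
        p = p + v * dt + 0.5 * a_ref * dt * dt       # integrate twice
        v = v + a_ref * dt
    return p, v, R
```

A real implementation would additionally compensate gravity and estimate sensor biases, since pure double integration drifts quickly.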
[00038] Such inertial measurement units based on MEMS components, configured as miniaturized devices or assemblies, are already sufficiently well known from the prior art and have long been produced in large-scale production.
[00039] The accelerations of the camera system, of the projector and/or of the measurement object captured during the measurement operation - or additionally also between a plurality of measurement operations - can be used here, according to the invention, in the evaluation (for example, in the determination of the 3D coordinates of the measurement points from the image sequence, or in the joining of measurement results obtained from a plurality of measurement operations, that is, from a plurality of image sequences) for various purposes and for improving various aspects.
[00040] If - as provided in the context of the invention - during
the exposure times of the individual images of an image sequence the accelerations are measured at a sufficiently high rate (that is, a rate providing at least a few - for example, between 5 and 50 - acceleration values per exposure duration of an individual image), the movements of the projector, of the camera system and/or of the measurement object during the exposure times of the individual images of the image sequence, which movements cause camera shake and/or motion blur, can be taken into account algorithmically using these measured acceleration values. The measured acceleration values can preferably be used - according to methods sufficiently known, for example, from photography - to compensate or correct the camera shake and/or motion blur in the individual images of an image sequence.
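One way such acceleration values could feed a blur correction - purely a sketch, with an assumed one-dimensional image axis and arbitrary pixel scaling - is to integrate them twice to obtain the displacement path during the exposure and to histogram the dwell time into a point-spread function for a subsequent deconvolution step:

```python
# Illustrative sketch: deriving a (here one-dimensional) blur kernel from
# the acceleration values measured during a single exposure. Integrating
# the accelerations twice gives the image displacement over the exposure;
# the time spent at each displacement forms the point-spread function that
# a deconvolution step could then use. Scale factors (pixels per metre,
# number of bins) are assumptions, not from the patent.
import numpy as np

def blur_kernel_from_accels(accels, dt, n_bins=9):
    """accels: accelerations sampled during one exposure (along one image
    axis). Returns a normalized PSF over displacement bins."""
    v = np.cumsum(np.asarray(accels, dtype=float)) * dt   # velocity
    x = np.cumsum(v) * dt                                 # displacement
    edges = np.linspace(x.min() - 1e-12, x.max() + 1e-12, n_bins + 1)
    hist, _ = np.histogram(x, bins=edges)
    return hist / hist.sum()                              # dwell time per position
```

The resulting kernel could then be used with a standard non-blind deconvolution method to sharpen the individual image.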
[00041] A lack of stability in the position and orientation of the measurement system and the measurement object relative to one another during the exposure times of the images can in this case be caused, for example, by a user holding the projector, the camera system and/or the measurement object, in particular by inadvertent hand tremor, or by vibrations or oscillations in the mounts of the projector, of the camera system and/or of the measurement object.
[00042] The movement in space captured according to the invention can therefore be used, for example, to correct the blurring effect in the individual recordings of digital cameras or triangulation scanners. In addition, in hand-held measurement systems it is possible to eliminate, or at least reduce, the principal limitations caused by the tremor of a user's hand.
[00043] Disturbing oscillations/vibrations during measurements, which often occur in the case of stationary mounting of the
measurement systems, can thus also, according to the invention - as described above - be taken into account in the evaluation and determination of the 3D coordinates. As a result, it is possible, for example, to dispense with oscillation-damping measures, which are often very complicated in terms of construction (or at least significantly less complex measures can be provided), with a simultaneous improvement in the accuracy of the 3D coordinate determination, or at least without having to accept losses in accuracy. The measurement system according to the invention is therefore better suited to direct use on a production line. In general, it is also possible - owing to the invention - to dispense with the offline operation of the measurement system in a special measurement space, which is - viewed overall - very complicated (and includes the removal of the measurement object from within the production line and its transport into the measurement space designed for this purpose).
[00044] According to one development, the captured accelerations of the measurement system components (that is, of the camera arrangement/projector) and/or of the measurement object can also be used according to the invention to associate with the respective images of an image sequence, in each case, the image recording position and direction relative to the measurement object that is current at the respective recording time. To this end, the accelerations are captured during an entire measurement operation (that is, the entire operation of recording an image sequence or a plurality of image sequences). In particular, for this purpose, the accelerations can be measured at such a rate that a sufficiently exact association with the respective recording times of the individual images becomes possible. If the accelerations
are captured at a rate significantly higher than the rate at which the images are sequentially recorded, it is also possible to associate with the respective images those image recording positions and directions that result from an averaging of the accelerations captured during the exposure times of the individual images.
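A minimal sketch of the averaging association just described (function names and the timing model are illustrative assumptions):

```python
# Sketch: when pose samples derived from the accelerations arrive at a much
# higher rate than the image rate, each image can be tagged with the mean
# of the pose samples falling inside its exposure window.
import numpy as np

def pose_per_image(sample_times, positions, exposure_windows):
    """sample_times: (N,) timestamps of pose samples; positions: (N, 3)
    positions derived from the accelerations; exposure_windows: list of
    (t_start, t_end) per image. Returns one averaged recording position
    per image."""
    t = np.asarray(sample_times, dtype=float)
    pos = np.asarray(positions, dtype=float)
    result = []
    for t0, t1 in exposure_windows:
        mask = (t >= t0) & (t <= t1)
        if not mask.any():
            raise ValueError("no pose samples inside exposure window")
        result.append(pos[mask].mean(axis=0))
    return np.array(result)
```

The same averaging would apply analogously to the recording directions (orientations), with suitable averaging on rotations.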
[00045] The respective recording positions and directions associated with the individual images of an image sequence can be used for simplified linking of pixels in the individual images with identical measurement points on the surface of the measurement object, even in the case where - for example, owing to tremor - variability in the recording positions and directions of the individual images occurs within an image sequence.
[00046] The computational outlay for the evaluation of an image sequence (which is necessary in particular for the forward intersection method) can thus, according to the invention, be significantly reduced, since it is possible to determine in advance, derived from the recording positions and directions, in which pixels of the respective images of the image sequence the respectively identical measurement points of the surface are imaged.
[00047] Whereas in prior art systems an excess of images per image sequence sometimes had to be recorded in order to bring the images into a spatial relationship by image processing prior to the actual evaluation, it is now possible according to the invention to reduce the computational outlay and/or the number of images that need to be recorded per image sequence for a comparably accurate determination of the 3D coordinates from this image sequence.
[00048] On the other hand, however, the invention also makes it possible for the images of an image sequence to be recorded at longer measurement intervals, since the influence exerted by unsteadiness of the measurement components during the recording of the series of images is, according to the invention, sufficiently well compensable or correctable and does not lead to measurement errors.
[00049] For example, in order to measure larger regions of the surface of the measurement object, going beyond the pattern projection and viewing region of the projector and camera arrangement, it is necessary according to the prior art for a plurality of measurement operations to be performed and for the results (for example, point clouds) subsequently to be joined together ("stitched") on the basis of partial regions measured in an overlapping manner.
[00050] According to the invention, it is now also possible, for example, to perform one relatively long measurement operation (for example, of up to 1-5 seconds or longer, with continuously sequential image recording), in which the projector and the camera system are deliberately moved in such a way that the entire measurement region is covered (for example, the entire measurement object on all sides). The speed of movement, the succession of the projected patterns and the frequency of image recording must in this case be adapted and configured in such a way that all the partial regions are illuminated with patterns varying sufficiently for the evaluation and a sufficient number of images of them are recorded.
[00051] Instead of a plurality of individual measurement operations, each with a position and alignment of the camera-projector assembly held as still as possible during the respective individual measurement and whose measurement results are subsequently linked in a computationally intensive way, it is therefore possible according to the invention to carry out a continuous measurement operation in a sliding manner, and to bring the individual images - for evaluation of the image sequence - into a spatial relationship with one another with respect to their respective recording positions and directions, based on the accelerations measured according to the invention.
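Purely by way of illustration, the spatial relationship between the individual images of such a sliding measurement operation follows from integrating the measured accelerations over time. A minimal one-dimensional sketch in plain Python (the function and its constant-rate sampling are illustrative assumptions, not taken from the patent; a real implementation would integrate all six degrees of freedom and correct for gravity and sensor bias):

```python
def integrate_pose(accels, dt, v0=0.0, p0=0.0):
    """Integrate a 1D acceleration trace (m/s^2, sampled every dt seconds)
    into a position trace using the trapezoidal rule.

    Returns one position per sample, so each recorded image can be tagged
    with the position valid at its exposure time.
    """
    positions = []
    v, p = v0, p0
    prev_a = accels[0] if accels else 0.0
    for a in accels:
        v += 0.5 * (prev_a + a) * dt  # velocity from acceleration
        p += v * dt                   # position from velocity
        positions.append(p)
        prev_a = a
    return positions

# A head drifting at a constant 1 m/s (zero acceleration) for 1 s:
trace = integrate_pose([0.0] * 100, 0.01, v0=1.0)
```

Each recorded image can then be tagged with the position valid at its exposure time, which is the spatial relation used for the joining.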
[00052] In an additional embodiment of the invention, however, it is also possible for individual measurement operations to be carried out whose measurement results (for example, point clouds) can be joined ("stitched") according to the invention using the measured accelerations, with reduced computational expense. In this case, it is not strictly necessary for the respective partial regions to be measured in an overlapping manner in order to make the joining of the corresponding measurement results possible in the first place. Overlapping measurement can nevertheless be carried out, also in the context of the invention, to further increase the reliability and accuracy of the joining of individual measurement results. The measurement positions and directions of the respective individual measurement operations - which can be derived from the measured accelerations - can also be used, for example, to provide an improved initial value for the computational joining of the point clouds, if they refer to identical points (control points) or identical patterns/geometries (in overlapping regions).
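The use of an IMU-derived pose as an improved initial value for joining point clouds can be sketched as follows (a hypothetical example using numpy; the pose and point values are invented for illustration):

```python
import numpy as np

def apply_pose(points, rotation, translation):
    """Transform an (N, 3) point cloud by a rigid pose (rotation matrix
    plus translation vector), e.g. a pose derived from the measured
    accelerations, to bring an individual measurement into the common
    frame before the fine computational joining."""
    return points @ rotation.T + translation

# Hypothetical example: a second measurement taken after the measuring
# head moved 0.5 m along x; undoing that motion aligns the two clouds.
cloud_b = np.array([[0.5, 0.0, 1.0],
                    [0.7, 0.2, 1.1]])
pose_R = np.eye(3)
pose_t = np.array([-0.5, 0.0, 0.0])
aligned_b = apply_pose(cloud_b, pose_R, pose_t)
```

Starting the subsequent fine registration from such a pre-aligned state is what reduces the computational expense of the joining.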
[00053] According to a further aspect of the invention, the recording positions and directions associated with the respective images, which are derived from the measured accelerations, can also be used to densify the measurement region (in such a way that the 3D coordinates are determined for a greater number of measurement points within a specific measurement region). For this purpose, it is possible, for example - with a slight, deliberate movement of the projector, the camera assembly and/or the measurement object (for example, caused by the natural tremor of a user's hand) - for a series of patterns to be projected onto the surface region and a series of images to be recorded of it (or for a plurality of series from the same measurement region to be joined), and, with the aid of the measured accelerations, for the images to be arranged in a spatial relationship with one another in a highly precise manner, such that, as a result, the 3D coordinates can be determined within the measurement region with a higher measurement-point density. In particular, it is possible as a result for the 3D coordinates of the measurement points on the surface of the measurement object to be determined, for example, also in the sub-pixel region of the individual images.
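The densification described above can be pictured with a one-dimensional sketch (plain Python; the sample values and the half-step offset are invented for illustration): two samplings of the same surface line, whose relative offset is known from the IMU-derived registration, interleave into one denser sampling.

```python
def densify(samples_a, samples_b, offset):
    """Merge two samplings of the same surface line into one denser set.

    samples_*: lists of (x, z) measurement points; offset: the lateral
    shift of the second sampling, assumed known from the IMU-derived
    registration of the images."""
    shifted = [(x + offset, z) for (x, z) in samples_b]
    return sorted(samples_a + shifted)

# Two sequences over the same region, the second shifted by half a step:
line_a = [(0.0, 1.00), (1.0, 1.10), (2.0, 1.20)]
line_b = [(0.0, 1.05), (1.0, 1.15), (2.0, 1.25)]
dense = densify(line_a, line_b, 0.5)  # six points instead of three
```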
[00054] In a similar way, it is also possible according to the invention, for systems that substantially use coherent optical radiation for pattern illumination, to reduce the negative influence on the measurement results (for example, local measurement inaccuracies or measurement-point gaps) caused by speckle fields occurring inadvertently in the respective patterns of the pattern sequence. If, for example, the projector, the camera assembly and/or the measurement object are slightly moved in a deliberate manner during the measurement operation (for example, due to the natural tremor of a user's hand), the speckle fields of the patterns projected onto the surface of the measurement object will also change. As a result, the speckle fields in the respective images do not always appear at identical locations on the surface of the measurement object. Therefore, in the context of an image sequence recorded in this way, there should be few or no locations that are not imaged, in at least some of the images of the image sequence, under illumination with a substantially speckle-free pattern. By juxtaposing the images of the image sequence (in which the images are again arranged in spatial relation to one another according to the invention, using the accelerations measured for this purpose), it is therefore possible to reduce the negative influence caused by speckle in the case of pattern projection with coherent optical radiation.
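The juxtaposition of images with shifted speckle fields can be sketched as a pixel-wise median over the registered images (plain Python; the intensity values are invented for illustration):

```python
def despeckle(images):
    """Pixel-wise median over a stack of registered images (each image a
    list of intensity values of equal length).  If the speckle fields
    moved between the exposures, an outlier intensity at a given pixel
    appears in only a minority of the images and the median rejects it."""
    def median(values):
        ordered = sorted(values)
        return ordered[len(ordered) // 2]
    return [median(pixel_stack) for pixel_stack in zip(*images)]

# Three registered exposures of one image row, each with one speckle dropout:
rows = [
    [100, 100,  30, 100],   # speckle at pixel 2
    [100,  25, 100, 100],   # speckle at pixel 1
    [100, 100, 100,  20],   # speckle at pixel 3
]
clean = despeckle(rows)     # the dropouts are suppressed
```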
[00055] In short, it is therefore possible according to the invention to deliberately move the measurement object, the camera system and/or the projector of the measurement system, for various purposes, during the measurement operation (the pattern-projection sequence and the image-recording sequence), or to reduce or eliminate the negative influence caused by an actually unwanted movement. For this purpose, the accelerations of the projector, the camera assembly and/or the measurement object are measured using the inertia sensors and taken into account when evaluating the individual images of the image sequence.
[00056] In particular, it is possible according to the invention for the compensation or correction of influences caused by movements during the exposure time of the individual images (camera shake/motion blur in the image) to be performed for each image separately, using the measured accelerations.
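The magnitude of the motion during one exposure, which sizes the per-image correction, can be estimated from the acceleration samples of that exposure window. A one-dimensional sketch (plain Python; illustrative values and function only):

```python
def blur_extent(accels, dt, v0=0.0):
    """Total displacement (absolute, 1D) of the camera over one exposure
    window, from the plurality of acceleration samples captured within
    that window.  The result sizes the motion-blur correction applied
    to that individual image."""
    v, displacement = v0, 0.0
    for a in accels:
        v += a * dt
        displacement += v * dt
    return abs(displacement)

# A head drifting at 0.2 m/s through a 100 ms exposure sampled at 1 kHz:
extent = blur_extent([0.0] * 100, 0.001, v0=0.2)
```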
[00057] In an additional or alternative way, it is, however, also possible for a current recording position and direction derived from the accelerations (averaged, if appropriate) relative to the measurement object (and, if appropriate, a current pattern-projection position and direction) to be associated with each image, and for the images thus to be brought into a spatial relationship with one another for determining the 3D coordinates that result from the image sequence. It is possible, for example, in this case for a deliberate movement of the measurement object, the camera system and/or the projector to be carried out:
- to enlarge the measurement region on the surface of the measurement object,
- to densify the measurement region and thereby increase the measurement-point density on the surface of the measurement object, and/or
- to change the speckle fields, which appear inadvertently in the case of illumination with substantially coherent optical radiation, in the respective patterns of the pattern sequence and thereby to reduce local measurement inaccuracies or measurement-point gaps caused by such speckle fields.
[00058] In a specific way, it is possible here for the movement made for these purposes to be caused by a user holding the measurement object and/or the camera system in his hand, and/or by a support designed for the same and controlled manually or in an automatically preprogrammed way - in particular a robot arm - for the projector, the camera system and/or the measurement object.
[00059] Again, in an additional or alternative way, it is possible, during the successive execution of a plurality of individual measurement operations, for the measurement position and alignment (of the camera assembly, the projector and the measurement object relative to one another) to be associated with each measurement operation, so that the joining of the results of the plurality of individual measurement operations is simplified or made possible in the first place.
[00060] The method according to the invention and the apparatus according to the invention will be described in more detail below with reference to the concrete exemplary embodiments illustrated schematically in the drawings in a purely exemplary manner, in which additional advantages of the invention will also be mentioned. In the figures:
Figure 1 shows an optical measurement system for determining 3D coordinates, in which an inertia measurement unit (IMU) is integrated according to the invention within the manual measurement head;
figure 2 shows an optical measurement system according to the invention with a manual measurement head that has an IMU, a projector and three cameras, in which a car door as the measurement object is lit with a pattern during the 3D coordinate determination;
figures 3 and 4 show an optical measurement system according to the invention with a manual measurement head that has an IMU, a projector and a camera, in which a car door as the measurement object is successively lit with patterns that have different degrees of fineness;
figures 5 and 6 show an optical measurement system according to the invention with a measurement head attached to a robot arm, in which a car door as the measurement object is successively lit with stripe patterns that have different degrees of fineness;
figure 7 shows an optical measurement system according to the invention with a manual measurement head, in which the oscillation caused by a hand shake during the measurement is illustrated;
figure 8 shows an optical measurement system according to the invention with an IMU on the measurement object, in which the measurement object is arranged in different positions to enlarge the measurement region, and the images recorded in the different positions are joined based on the measured accelerations;
figure 9 shows an optical measurement system according to the invention with a manual measurement head that has an IMU, in which the measurement head is arranged in different positions to enlarge the measurement region and the images recorded in the different positions are joined based on the measured accelerations; and
figure 10 shows an optical measurement system according to the invention in use on a production line, in which the vibrations, which affect the measurements of the measurement system according to the invention and which are transferred from neighboring production stations, are compensated based on the measured accelerations.
[00061] The optical measurement system 7 shown in figure 1 for determining the 3D coordinates of a multiplicity of measurement points on the measurement object surface 1s has, according to the invention, a projector 3, a camera system 4, an evaluation unit 6 and inertia sensors 5a integrated in an inertia measurement unit (IMU).
[00062] The projector 3 is, in this case, configured for illuminating the measurement object surface 1s with a pattern sequence of different optical patterns 2a. For example, the projector 3 can be configured similarly to the principle of a slide projector. However, it is also possible for other projection techniques to be used to generate the light patterns 2a, for example, programmable LCD projectors, displaceable glass slides with different grid structures in a projector, a combination of an electrically switchable grid and a mechanical displacement device, or the projection of individual grids based on glass slides.
[00063] The camera system 4 is configured to record an image sequence of the measurement object surface 1s illuminated with the pattern sequence and can have, in this case, at least one camera, in particular two, three or four cameras 4a, 4b, 4c, which can be arranged, for example, with fixed and known positioning and orientation relative to one another, and are configured in a specific way for the substantially simultaneous recording of individual images.
[00064] As is known to a person skilled in the art, it is possible to use for the image recording, for example, cameras 4a, 4b, 4c with electronic image sensors, for example, CCD or CMOS sensors, which provide the image information in the form of an image matrix for further processing. Both monochrome and color cameras can be used in this case.
[00065] The evaluation unit 6 is configured for determining the 3D coordinates of the measurement points from the image sequence, in particular by verifying a succession of luminance values for identical measurement points of the measurement object surface 1s in the respective images of the recorded image sequence.
[00066] The projector 3 and the camera system 4 are physically accommodated, according to an exemplary embodiment, with fixed and known positioning and orientation relative to each other in a common measuring head 8 of the measuring system 7, in particular in which the measuring head 8 is configured to be held in the user's hand and/or to be attached to a robot arm.
[00067] According to the invention, the IMU, which has the inertia sensors 5a, is also integrated within the measuring head 8, in which the inertia sensors 5a are thus configured to measure the translational and rotational accelerations of the measuring head 8 (that is, of the projector 3 and the camera system 4) during the recording of the image sequence. The inertia sensors 5a are configured, in this case, for the measurement of the accelerations at at least such a measurement rate that, during the exposure time of the respective individual images of the image sequence, in each case a plurality of values, in particular a multiplicity of values, for the accelerations can be captured.
[00068] The evaluation unit 6 is configured, in this case, in such a way that with it a synchronized control of the inertia sensors 5a and the camera system 4 is carried out, such that during the recording of the image sequence, in each case, a plurality of values for the accelerations is captured at least during the exposure time of individual images of the image sequence.
[00069] It is therefore finally possible according to the invention, using the evaluation unit 6, to algorithmically take into account - for determining the 3D coordinates, based on the accelerations measured by the inertia sensors 5a - the movements of the projector 3, the camera system 4 and/or the measurement object 1 that cause camera shake and/or motion blur in the respective individual images of the image sequence.
[00070] In this case it is possible, in particular, for the compensation or correction of effects in the image caused by movements during the exposure time of the individual images (camera shake/motion blur) to occur for each image separately, based on the measured accelerations.
[00071] For this purpose, the evaluation unit 6 can be configured in such a way that, based on the measured accelerations, the compensation and/or correction of camera shake and/or motion blur - caused by movements of the projector 3, the camera system 4 and/or the measurement object 1 that occur during the exposure time of the respective individual images of the image sequence - takes place in each case in the individual images of the image sequence.
[00072] In short, the accelerations are captured during the exposure time of the individual images of an image sequence at a sufficiently high rate (that is, at a rate that provides at least a few - for example, between 5 and 50 - acceleration values per exposure duration of an individual image), and based on this it is then possible to algorithmically take into account, using these measured acceleration values, the movements of the projector, the camera system and/or the measurement object during the exposure time of the individual images of the image sequence - movements that cause camera shake and/or motion blur in the image. The measured acceleration values can preferably be used, according to methods that are well known, for example, from photography, to compensate or correct the camera shake and/or motion blur in the individual images of an image sequence.
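The rate requirement can be checked with a one-line calculation (plain Python; the numeric values are taken from the ranges mentioned in the text, not from a concrete device):

```python
def samples_per_exposure(imu_rate_hz, exposure_s):
    """Number of acceleration values captured during one image exposure."""
    return round(imu_rate_hz * exposure_s)

# A 500 Hz inertial measurement rate with a 200 ms exposure yields
# 100 acceleration values per individual image; even the lower bounds
# mentioned in the text (50 Hz, 100 ms) still yield a plurality of values.
per_image = samples_per_exposure(500, 0.2)
low_end = samples_per_exposure(50, 0.1)
```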
[00073] The inertia sensors 5a of the inertia measurement unit can here be based, in particular, on MEMS-based components and be combined and integrated within the IMU in such a way that it is configured for the measurement of the accelerations in all six degrees of freedom, in particular at a measurement rate between approximately 50 and 2000 Hz.
[00074] In this way, it is possible for the illustrated optical measurement system 7 - in particular automatically controlled by the evaluation unit 6 and in a preprogrammed manner - to be configured and designed to carry out the optical measurement method according to the invention as already described above.
[00075] The exemplary embodiment of an optical measurement system 7 according to the invention shown in figure 2 has a manual measurement head 8 comprising an IMU (with inertia sensors 5a), a projector 3 and three cameras 4a, 4b, 4c (for example, integrated in a portable housing with handle and therefore configured as a portable structured-light 3D scanner), in which a car door as the measurement object 1 is illuminated with a pattern 2a (as part of the pattern sequence) using the projector 3 in the course of the 3D coordinate determination.
[00076] The three cameras 4a, 4b, 4c of the camera system 4, which are arranged here as an example with a fixed and known positioning and orientation, relative to each other, are configured to record an image sequence of the car door surface that is lit with the pattern sequence. In this case, cameras 4a, 4b, 4c can be configured for substantially simultaneous recording of individual images.
[00077] In addition, an inertia measurement unit (with inertia sensors 5a) is fixedly integrated within the measuring head 8, as a result of which compensation according to the invention of measurement errors caused, for example, by oscillation related to hand tremor can be performed during the evaluation of the image sequence and the derivation of the 3D coordinates. In particular, automatically controlled by the evaluation unit and in a preprogrammed manner, the illustrated optical measurement system 7 can be configured and designed to carry out the optical measurement method according to the invention as described above.
[00078] Figures 3 and 4 illustrate an optical measurement system 7 according to the invention with a manual measurement head 8 that has an IMU (with inertia sensors 5a), a projector 3 and a camera 4a (for example, integrated in a portable housing with handle and, therefore, configured as a portable structured-light 3D scanner), in which a car door as the measurement object 1 is illuminated successively with patterns 2a, 2b having different degrees of fineness as part of the pattern sequence (figure 3: coarser pattern 2a; figure 4: finely structured pattern 2b).
[00079] As is known from the prior art, the object (for example, the car door) is illuminated in this way with a sequence of light patterns 2a, 2b of different structural fineness in order to obtain an unambiguous determination of the depth of the measurement points in the measurement region with the aid of triangulation (intersection). In this case, a plurality of images is also recorded (that is, a series of images) while the measurement object 1 is illuminated with the corresponding different patterns 2a, 2b (that is, with a series of patterns).
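The depth determination by triangulation can be sketched for a hypothetical rectified projector-camera geometry (plain Python; the baseline, focal length and pixel coordinates are invented for illustration). The pattern sequence of different fineness serves only to make the projector coordinate, i.e. the correspondence, unambiguous:

```python
def triangulate_depth(baseline_m, focal_px, x_cam_px, x_proj_px):
    """Depth by triangulation for a rectified projector-camera pair.

    baseline_m: projector-camera distance; focal_px: focal length in
    pixels (assumed equal for both devices in this sketch); x_cam_px and
    x_proj_px: corresponding horizontal coordinates.  Decoding the
    pattern sequence supplies the correspondence x_proj_px."""
    disparity = x_cam_px - x_proj_px
    return baseline_m * focal_px / disparity

# Invented numbers: 0.2 m baseline, 1000 px focal length, 50 px disparity:
depth = triangulate_depth(0.2, 1000.0, 150.0, 100.0)   # 4.0 m
```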
[00080] According to the invention, once again an inertia measurement unit (with inertia sensors 5a) is integrated within the measuring head 8 of the 3D scanner illustrated in figures 3 and 4, as a result of which compensation of measurement errors caused, for example, by oscillation related to hand tremor can be carried out according to the invention during the evaluation of the image sequence and the derivation of the 3D coordinates. In particular, automatically controlled by the evaluation unit and in a preprogrammed manner, the illustrated optical measurement system 7 can be configured and designed to carry out an individual embodiment, or a plurality of the embodiments described above, of the optical measurement method according to the invention.
[00081] Figures 5 and 6 illustrate an optical measurement system 7 according to the invention similar to the system of figures 3 and 4, except that in this case the measurement head 8 is configured as a measurement head 8 attached to a robot arm and the projector 3 is configured for the successive projection of stripe patterns 2a, 2b having different degrees of fineness as the pattern sequence.
[00082] According to the invention, the measuring head 8 shown in figures 5 and 6 also has an IMU (with inertia sensors 5a), as a result of which compensation according to the invention of measurement errors caused, for example, by vibrations transferred to the robot arm from the environment of the measurement region can be carried out during the evaluation of the image sequence and the derivation of the 3D coordinates. Alternatively or additionally, the measured accelerations can also be used to spatially join ("stitch") individual images captured from different positions of the robot arm (as part of one or more image sequences), in such a way that - depending on the choice of the different recording positions, which can be adapted by a person skilled in the art according to the requirements - the measurement region can thus be enlarged and/or densified, or the speckle fields occurring inadvertently in the respective patterns 2a, 2b of the pattern sequence in the case of illumination with substantially coherent optical radiation can be changed, and the local measurement inaccuracies or measurement-point gaps caused by such speckle fields thereby reduced.
[00083] Figures 7 and 9 show a measurement system 7 similar to that of figure 1 and illustrate here an oscillation/movement of the measurement head 8 during the measurement (caused inadvertently by hand tremor or - for example, for the purpose of densifying or enlarging the measurement region - deliberately). The accelerations measured using the IMU (with inertia sensors 5a) can then be used to spatially join ("stitch") individual images captured from different hand-held positions (as part of one or more image sequences).
[00084] Furthermore, it is possible in this case for shake to occur in individual pattern projections 2a, 2b (of the pattern sequence) onto the measurement object 1 and for shake to occur in individual image recordings (of an image sequence), in which case the errors caused in the image by the camera shake can likewise be corrected or compensated, or taken into account according to the invention during the determination of the 3D coordinates, using the accelerations measured by the IMU integrated in the measuring head 8.
[00085] Figure 8 illustrates an optical measurement system 7 according to the invention that has an IMU (with inertia sensors 5b) arranged on the measurement object 1, in which the measurement object 1 can be arranged in different positions to enlarge the measurement region and - according to the invention - the different positions of the measurement object 1 recorded in the individual images (of the image sequence) can be joined based on the accelerations measured by the IMU and arranged in a spatial relationship with one another.
[00086] In addition to the IMU (with inertia sensors 5b) on the measurement object 1, it is also possible for an IMU (with inertia sensors 5a) to be integrated inside the measuring head 8. As a result, it is possible - as described above - for the movements of the measuring head 8 that occur during the measurement also to be taken into account according to the invention when determining the depth information and the 3D coordinates.
[00087] Figure 10 shows an optical measurement system 7 according to the invention in use on a production line, in which the vibrations that affect the measurements with the measurement system 7 according to the invention, which vibrations are transferred from neighboring production stations, are compensated based on the measured accelerations.
[00088] For this purpose, in each case an IMU (with inertia sensors 5a and/or 5b) can be arranged according to the invention - as also described above, for example, in conjunction with figure 8 - both on the measurement object 1 and integrated into the measuring head 8 (which in this case has two cameras, purely as an example), as a result of which compensation according to the invention of measurement errors caused, for example, by vibrations transferred from the environment of the measurement region and by oscillation of the measuring head 8 can be carried out during the evaluation of the image sequence and the derivation of the 3D coordinates.
[00089] It goes without saying that the illustrated figures merely illustrate possible exemplary embodiments schematically. The different approaches can likewise be combined with one another and with methods of the prior art.
Claims (14)
1. Optical measurement method to determine the 3D coordinates of a multiplicity of measurement points on the surface of the measurement object (1s), which comprises the steps of,
- illuminate the surface of the measurement object (1s) with the pattern sequence of different patterns (2a, 2b) using a projector (3),
- recording an image sequence of a plurality of individual images of the surface of the measurement object (1s), which is illuminated with the pattern sequence, using a camera system (4), and
- determine the 3D coordinates of the measurement points by evaluating the image sequence, in particular in which a succession of luminance values is checked for identical measurement points on the surface of the measurement object (1s) in the respective images of the image sequence, characterized by the fact that,
- during the recording of the image sequence at least during the exposure time of individual images of the image sequence, translational and / or rotational accelerations:
- the projector (3),
- the camera system (4) and / or
- of the measurement object (1) are measured at at least such a measurement rate that, during the exposure time of the respectively individual images of the image sequence, in each case a plurality of values, in particular a multiplicity of values, for the accelerations is captured,
- the movements of the projector (3), the camera system (4)
and / or the measurement object (1), which cause camera shake and / or blur due to motion in the respective individual images of the image sequence and occur during the exposure time of the respective individual images of the image sequence, are algorithmically taken into account when determining the 3D coordinates based on the measured accelerations, and
- during the entire operation of recording the image sequence or a plurality of image sequences, the accelerations are measured, and the information obtained by evaluating the individual images with respect to the 3D coordinates of the measurement points is joined computationally using the measured accelerations.
2. Optical measurement method according to claim 1, characterized by the fact that the accelerations of the projector (3), of the camera system (4) and / or of the measurement object (1) are measured in all six degrees of freedom, and the accelerations are measured continuously at a specific measurement rate, in particular between approximately 50 and 2000 Hz, specifically during the entire operation of recording the image sequence.
3. Optical measurement method according to claim 1 or 2, characterized by the fact that, based on the measured accelerations, the compensation and / or correction of camera shake and / or blur due to motion, which is caused by movements of the projector (3), the camera system (4) and / or the measurement object (1) that occur during the exposure time of the respective individual images of the image sequence, takes place respectively in the individual images of the image sequence, in which the movements are specifically caused:
- by a user holding the projector (3), the camera system (4) and / or the measurement object (1) in his hand, in particular caused inadvertently by hand tremor, or
- by vibrations or oscillations in the supports of the projector (3), the camera system (4) and / or the measurement object (1).
4. Optical measurement method according to any one of claims 1 to 3, characterized by the fact that during the registration operation,
- to enlarge the measurement region on the surface of the measurement object (1s),
- to densify the measurement region and thereby increase a density at the measurement point on the surface of the measurement object (1s) and / or
- to change the speckle fields, which occur inadvertently in the case of illumination with substantially coherent optical radiation, in the respective patterns (2a, 2b) of the pattern sequence and thereby to reduce local measurement inaccuracies or measurement-point gaps caused by such speckle fields, the measurement object (1), the camera system (4) and / or the projector (3) is deliberately moved, in particular in which the movement carried out for this purpose is performed by
- a user holding the measurement object (1) and / or the camera system (4) in his hand, and / or
- a support that is designed for it and that is controlled manually or in an automatically preprogrammed way - in particular a robot arm - for the projector (3), the camera system (4) and / or the measurement object ( 1).
5. Optical measurement method according to any one of claims 1 to 4, characterized by the fact that the spatial relations between the individual recorded images, relative to one another and with respect to their recording positions and directions in relation to the measurement object (1), relations that are derived from the measured accelerations, are used as initial conditions for the computational joining, in such a way that the computational joining per se requires a reduced computational expense compared with a method in which such initial conditions are not used.
6. Optical measurement method according to any one of claims 1 to 5, characterized by the fact that the 3D coordinates of the measurement points are determined photogrammetrically according to the principle of triangulation from the image sequence and with knowledge of the pattern of the pattern sequence captured in the respective images of the image sequence, in particular using forward intersection.
7. Optical measurement method according to any one of claims 1 to 6, characterized by the fact that the illumination and the recording are carried out from positions that are known relative to each other and with alignments that are known relative to each other, in particular in which the recording is performed with a plurality of cameras (4a, 4b, 4c) as part of the camera system (4) from different positions.
8. Optical measurement method according to any one of claims 1 to 7, characterized by the fact that the surface of the measurement object (1s) is lit successively with:
- stripe patterns with varying degrees of fineness,
- pseudocodes and / or
- random patterns like patterns other than the pattern sequence, in particular in which the lighting is performed with the individual patterns (2a, 2b) substantially in the temporal and direct succession with a projection duration of approximately 100 and 300 ms, in a specific manner approximately 200 ms, and the image sequence is recorded with an exposure duration per image in each
Petition 870190108144, of 10/24/2019, p. 45/54
5/7 case of approximately 100 ms and 300 ms, specifically approximately 200 ms.
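The coarse-to-fine stripe illumination named in claim 8 is commonly realized with binary Gray-code patterns; the following sketch is an assumption about one possible pattern family, not the claimed sequence itself, and generates stripes such that adjacent projector columns differ in a single bit:

```python
# Sketch (assumption): a coarse-to-fine binary stripe sequence of the
# kind used in structured-light scanning, encoded as Gray code so
# neighbouring columns differ by exactly one bit.
import numpy as np

def gray_code_patterns(width, n_bits):
    """Return n_bits binary stripe patterns, one pattern per row."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)  # binary-reflected Gray code per column
    shifts = np.arange(n_bits - 1, -1, -1)[:, None]  # MSB first
    bits = (gray[None, :] >> shifts) & 1
    return bits.astype(np.uint8)

patterns = gray_code_patterns(width=8, n_bits=3)
# patterns[0] is the coarsest stripe pattern, patterns[-1] the finest
```

Projected one after another (for example with the roughly 200 ms projection duration the claim mentions), the bit read out per pixel across the sequence identifies the projector column for triangulation.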
[9]
9. Optical measurement system (7) to determine the 3D coordinates of a multiplicity of measurement points on the surface of the measurement object (1s), which comprises,
- a projector (3) to illuminate the surface of the measurement object (1s) with the pattern sequence of different optical patterns (2a, 2b),
- a camera system (4) for recording an image sequence of a plurality of individual images of the surface of the measurement object (1s) which is illuminated with the pattern sequence, and
- an evaluation unit (6) for determining the 3D coordinates of the measurement points from the image sequence, in particular while ascertaining a succession of brightness values for identical measurement points on the surface of the measurement object (1s) in the respective images of the image sequence, characterized by the fact that inertia sensors (5a, 5b) are arranged:
- on the projector (3),
- on the camera system (4) and/or
- on the measurement object (1) in order to measure the translational and rotational accelerations of the projector (3), the camera system (4) and/or the measurement object (1) at at least such a measurement rate that, during the exposure times of the respective individual images of the image sequence, in each case a plurality of values, in particular a multiplicity of values, for the accelerations can be captured, and that the evaluation unit (6) is configured
- to synchronously control the inertia sensors (5a, 5b) and the camera system (4) in such a way that, during the recording of the image sequence, in each case a plurality of values for the accelerations is captured at least during the exposure times of the individual images of the image sequence, and
- to take into account algorithmically, when determining the 3D coordinates, the movements of the projector (3), the camera system (4) and/or the measurement object (1) that cause camera shake and/or motion blur in the respective individual images of the image sequence, based on the accelerations measured by the inertia sensors (5a, 5b), wherein during the entire operation of recording the image sequence, or a plurality of image sequences, the accelerations are measured, and the information obtained by evaluating the individual images with regard to the 3D coordinates of the measurement points is joined computationally using the measured accelerations.
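To give a feel for why a plurality of acceleration values per exposure matters, this hedged sketch (hypothetical function, rate and shake values; not the claimed algorithm) double-integrates the accelerations sampled during one exposure into a displacement path that a later step could use to account for the motion:

```python
# Sketch (assumption): double-integrate the accelerations sampled
# during one exposure, starting from rest, to obtain the sensor-head
# displacement path over that exposure.
import numpy as np

def trajectory_during_exposure(accels, rate_hz):
    """Integrate (N, 3) accelerations [m/s^2] sampled at rate_hz
    into (N, 3) displacements [m], assuming zero initial velocity."""
    dt = 1.0 / rate_hz
    velocity = np.cumsum(accels, axis=0) * dt       # a -> v
    return np.cumsum(velocity, axis=0) * dt         # v -> x

# Hypothetical: 200 ms exposure sampled at 1000 Hz -> 200 values,
# with a constant 0.5 m/s^2 shake along X.
acc = np.zeros((200, 3))
acc[:, 0] = 0.5
path = trajectory_during_exposure(acc, 1000.0)
# path[-1, 0] is close to 0.5 * a * t^2 = 0.01 m
```

The 200 samples per 200 ms exposure used here sit comfortably inside the approximately 50 to 2000 Hz measurement rate mentioned for the inertial measurement unit.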
[10]
10. Optical measurement system (7) according to claim 9, characterized in that the inertia sensors (5a, 5b) are combined and integrated in an inertial measurement unit, in particular one based on MEMS components, in such a way that the inertial measurement unit is configured to measure all accelerations in six degrees of freedom, in particular at a measurement rate of approximately 50 to 2000 Hz.
[11]
11. Optical measurement system (7) according to claim 9 or 10, characterized by the fact that the evaluation unit (6) is configured in such a way that, based on the measured accelerations, compensation and/or correction of the camera shake and/or motion blur, which is caused by movements of the projector (3), the camera system (4) and/or the measurement object (1) occurring during the exposure times of the respective individual images of the image sequence, is carried out in each case in the individual images of the image sequence.
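One conceivable way to turn a measured in-exposure motion path into such a blur correction (an assumption for illustration; the patent does not specify the algorithm) is to histogram the visited pixel offsets into a blur kernel that a subsequent deconvolution step could invert:

```python
# Sketch (assumption): build a 1D blur kernel from the pixel offsets
# the optical axis visited during one exposure; a deconvolution step
# could then use this kernel to correct the motion blur.
import numpy as np

def blur_kernel_1d(displacements_px, size):
    """Histogram a 1D motion path (in pixels) into a normalised kernel."""
    centre = size // 2
    idx = np.round(displacements_px).astype(int) + centre
    idx = np.clip(idx, 0, size - 1)
    kernel = np.bincount(idx, minlength=size).astype(float)
    return kernel / kernel.sum()

# Hypothetical path drifting from 0 to 3 px during one exposure
path_px = np.linspace(0.0, 3.0, 50)
k = blur_kernel_1d(path_px, size=9)
# k sums to 1 and is non-zero only where the path dwelled
```

The kernel weights reflect how long the image dwelled at each offset, which is exactly the information the per-exposure acceleration samples provide.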
[12]
12. Optical measuring system (7) according to any one of claims 9 to 11, characterized in that the projector (3) and the camera system (4) are physically accommodated, with a fixed and known positioning and orientation relative to each other, in a common measuring head (8) of the measuring system, in which the inertia sensors (5a, 5b) are also arranged, in particular in which the measuring head (8) is configured to be held in the user's hand and/or to be attached to a robot arm.
[13]
13. Optical measurement system (7) according to any one of claims 9 to 12, characterized in that the camera system (4) has at least one camera (4a, 4b, 4c), in particular in which the camera system (4) contains two, three or four cameras (4a, 4b, 4c) that are arranged with a fixed and known positioning and orientation relative to each other and are configured for the substantially simultaneous recording of individual images.
[14]
14. Optical measurement system (7) according to any one of claims 9 to 13, characterized in that said measurement system is configured and designed to carry out the optical measurement method as defined in any one of claims 1 to 8.
Legal status:
2018-12-26 | B06F | Objections, documents and/or translations needed after an examination request (chapter 6.6, patent gazette)
2019-09-10 | B06U | Preliminary requirement: requests with searches performed by other patent offices; procedure suspended (chapter 6.21, patent gazette)
2019-12-03 | B09A | Decision: intention to grant (chapter 9.1, patent gazette)
2020-02-04 | B16A | Patent or certificate of addition of invention granted. Free-format text: term of validity of 20 (twenty) years counted from June 9, 2011, subject to the legal conditions.
2021-04-13 | B21F | Lapse under art. 78, item IV, for non-payment of the annual fees in time. Free-format text: refers to the 10th annuity.
2021-08-10 | B24J | Lapse because of non-payment of annual fees (definitive: art. 78 IV LPI, Resolution 113/2013, art. 12). Free-format text: in view of the extinction published in RPI 2623 of April 13, 2021, and in the absence of any response within the legal time limits, the extinction of the patent and its certificates is maintained, pursuant to art. 12 of Resolution 113/2013.
Priority:
Application number | Filing date | Patent title
EP10166672A | 2010-06-21 | Optical measurement method and system for determining 3D coordinates on a measurement object surface
PCT/EP2011/059641 | 2011-06-09 | Optical measurement method and measurement system for determining 3D coordinates on a measurement object surface